Bayesian Posterior Repartitioning for Nested Sampling

Authors

Abstract

Priors in Bayesian analyses often encode informative domain knowledge that can be useful in making the inference process more efficient. Occasionally, however, priors may be unrepresentative of the parameter values for a given dataset, which can result in inefficient exploration of the parameter space, or even incorrect inferences, particularly for nested sampling (NS) algorithms. Simply broadening the prior in such cases may be inappropriate or impossible in some applications. Hence our previous solution to this problem, known as posterior repartitioning (PR), redefines the prior and likelihood while keeping their product fixed, so that the posterior inferences and evidence estimates remain unchanged, but the efficiency of the NS process is significantly increased. In its most practical form, PR raises the prior to some power β, which is introduced as an auxiliary variable that must be determined on a case-by-case basis, usually by lowering β from unity according to some pre-defined 'annealing schedule' until the resulting inferences converge to a consistent solution. Here we present a very simple yet powerful alternative approach, in which β is instead treated as a hyperparameter that is inferred from the data alongside the original parameters and then marginalised over to obtain the final inference. We show through numerical examples that this Bayesian PR (BPR) method provides a robust, self-adapting and computationally efficient 'hands-off' solution to the problem of unrepresentative priors in Bayesian inference using NS. Moreover, unlike the annealing method, BPR has negligible computational overhead relative to standard nested sampling, which suggests it should be used as the default in all NS analyses.
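The core repartitioning identity described above can be illustrated with a short numerical sketch. This is not the authors' code; the function names and the 1-D Gaussian setup are illustrative assumptions. The prior is raised to a power β and renormalised, and the likelihood absorbs the leftover factor, so the product (and hence the posterior and evidence) is unchanged pointwise:

```python
import numpy as np

# Illustrative sketch of posterior repartitioning (PR) in 1-D.
# The prior is tempered, pi_beta ∝ pi^beta, and the likelihood is redefined
# so that L * pi stays exactly the same.

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

theta = np.linspace(-10, 10, 4001)
dx = theta[1] - theta[0]

prior = gauss(theta, 0.0, 1.0)    # original (possibly unrepresentative) prior
like = gauss(theta, 5.0, 0.5)     # likelihood peaked far out in the prior tail

beta = 0.3
prior_beta = prior ** beta
z_beta = np.sum(prior_beta) * dx              # normalising constant of new prior
new_prior = prior_beta / z_beta               # broader, tempered prior
new_like = like * prior ** (1.0 - beta) * z_beta  # compensating likelihood

# The product, and therefore the posterior and the evidence, is unchanged:
assert np.allclose(new_prior * new_like, prior * like)
```

Because `new_prior` is broader than `prior`, NS live points drawn from it cover the region where the likelihood peaks, which is the mechanism by which PR restores sampling efficiency when the prior is unrepresentative.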


Similar Articles

Discussion of Nested Sampling for Bayesian Computations

We comment on several aspects of Skilling's paper presented at Valencia 8. In particular we prove the convergence in probability of the algorithm for a wide class of situations, comment on its potential utility and discuss aspects where further work is needed to assess the approach.


Nested Sampling for General Bayesian Computation

Nested sampling estimates directly how the likelihood function relates to prior mass. The evidence (alternatively the marginal likelihood, marginal density of the data, or the prior predictive) is immediately obtained by summation. It is the prime result of the computation, and is accompanied by an estimate of numerical uncertainty. Samples from the posterior distribution are an optional byprod...
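The summation described above can be made concrete with a minimal sketch. This is illustrative only, not Skilling's implementation: it pairs the lowest live-point likelihood with the expected shrinking prior mass X_i ≈ exp(-i/nlive), and uses naive rejection sampling for the constrained replacement draw (production NS uses far smarter constrained samplers). The toy problem, a unit Gaussian likelihood under a flat prior on [-5, 5]², is an assumption made here so the result can be checked analytically:

```python
import numpy as np

# Minimal nested-sampling sketch: the evidence Z = integral of L dX is
# accumulated by summation, as described in the abstract above.

rng = np.random.default_rng(0)
nlive, niter = 100, 600

def loglike(theta):
    # unit Gaussian likelihood; flat prior on the square [-5, 5]^2
    return -0.5 * np.sum(theta ** 2)

live = rng.uniform(-5, 5, size=(nlive, 2))
live_logl = np.array([loglike(t) for t in live])

logz_terms = []
x_prev = 1.0
for i in range(niter):
    worst = int(np.argmin(live_logl))
    x_i = np.exp(-(i + 1) / nlive)   # expected remaining prior mass
    logz_terms.append(live_logl[worst] + np.log(x_prev - x_i))
    x_prev = x_i
    # Replace the worst point with a fresh prior draw above the threshold
    # (naive rejection sampling, adequate only for this toy problem).
    threshold = live_logl[worst]
    while True:
        cand = rng.uniform(-5, 5, size=2)
        if loglike(cand) > threshold:
            live[worst], live_logl[worst] = cand, loglike(cand)
            break

# Remaining live points contribute roughly mean(L) * X_remaining.
logz_terms.append(np.log(np.mean(np.exp(live_logl))) + np.log(x_prev))
logz = np.logaddexp.reduce(np.array(logz_terms))

# Analytic check: Z = 2*pi / 100, i.e. log Z ≈ -2.77 for this toy problem.
```

The running sum also yields posterior samples as the optional by-product mentioned in the abstract: each discarded point, weighted by its term L_i (X_{i-1} − X_i) / Z.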


Approximate Slice Sampling for Bayesian Posterior Inference

In this paper, we advance the theory of large scale Bayesian posterior inference by introducing a new approximate slice sampler that uses only small mini-batches of data in every iteration. While this introduces a bias in the stationary distribution, the computational savings allow us to draw more samples in a given amount of time and reduce sampling variance. We empirically verify on three dif...



Discussion of Nested Sampling for Bayesian Computations by John Skilling

The ingredients to a Bayesian analysis comprise the sampling model {Pθ : θ ∈ Ω} for the response X, the prior Π for the parameter θ, and the observed value X = x. This is equivalent to specifying the joint model Pθ ×Π for (X, θ) and the observed value X = x. In many situations we may also have a loss function but we ignore this here as it is not material to our discussion. We refer to Pθ ×Π as ...



Journal

Journal: Bayesian Analysis

Year: 2022

ISSN: 1936-0975, 1931-6690

DOI: https://doi.org/10.1214/22-ba1323